Local input-output stability of recurrent networks with time-varying weights

Author

  • Jochen J. Steil
Abstract

We present local conditions for the input-output stability of recurrent neural networks with time-varying parameters, as introduced for instance by noise or on-line adaptation. The conditions guarantee that a network implements a proper mapping from time-varying input to time-varying output functions, using a local equilibrium as the point of operation. We show how to calculate the necessary bounds on the allowed inputs to keep the network in the stable range, and apply the method to an example of learning an input-output map implied by the chaotic Roessler attractor.
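As an illustration of the kind of local analysis described above, the following Python sketch checks stability around an operating equilibrium for a standard additive recurrent network x' = -x + W tanh(x) + u. The weight matrix, the constant input, and the crude small-gain margin are illustrative assumptions for this sketch, not the paper's exact construction or bounds.

    import numpy as np
    from scipy.optimize import fsolve

    # Additive recurrent network x' = -x + W tanh(x) + u (illustrative model;
    # the weights W and the constant input u0 below are arbitrary examples).
    rng = np.random.default_rng(0)
    n = 5
    W = 0.4 * rng.standard_normal((n, n))
    u0 = 0.1 * rng.standard_normal(n)

    def dynamics(x):
        return -x + W @ np.tanh(x) + u0

    # Locate a local equilibrium x* (dynamics(x*) = 0) to serve as the point of operation.
    x_star = fsolve(dynamics, np.zeros(n))

    # Linearise: J = -I + W diag(1 - tanh(x*)^2).  Local asymptotic stability
    # requires every eigenvalue of J to have negative real part.
    J = -np.eye(n) + W @ np.diag(1.0 - np.tanh(x_star) ** 2)
    locally_stable = np.max(np.linalg.eigvals(J).real) < 0

    # A crude sufficient bound in the spirit of a local small-gain argument:
    # if the slope of tanh (at most 1) times ||W|| stays below 1, bounded
    # inputs keep the state, and hence the output, near x*.
    small_gain = np.linalg.norm(W, 2) < 1.0

    print("equilibrium:", np.round(x_star, 3))
    print("locally stable:", locally_stable, "| small-gain bound:", small_gain)

The spectral-norm test only gives a rough indication; the paper derives sharper, input-dependent bounds from the local conditions.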

Related articles

Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, the global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
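For readers unfamiliar with the LMI approach, the sketch below shows only the bare Lyapunov feasibility step for a plain linear system x' = A x, using cvxpy; the fuzzy, stochastic, impulsive, and delay terms of the cited criterion are omitted, and the matrix A is a made-up example.

    import numpy as np
    import cvxpy as cp

    # Bare Lyapunov LMI: find P > 0 with A^T P + P A < 0, which certifies
    # asymptotic stability of x' = A x.  A is an arbitrary stable example.
    A = np.array([[-1.0, 0.5],
                  [-0.3, -0.8]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),
                   A.T @ P + P @ A << -eps * np.eye(n)]
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve()

    print("LMI feasible:", problem.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))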

Input-Output Stability of Recurrent Neural Networks with Time-Varying Parameters

We provide input-output stability conditions for additive recurrent neural networks regarding them as dynamical operators between their input and output function spaces. The stability analysis is based on methods from non-linear feedback system theory and includes the case of time-varying weights, for instance introduced by on-line adaptation. The results assure that there are regions in weight...

Monotonic Recurrent Bounded Derivative Neural Network

Neural networks applied in control loops and safety-critical domains have to meet hard requirements. First, a small approximation error is required; then the smoothness and monotonicity of selected input-output relations have to be taken into account; and finally, for some processes, time dependencies in time series should be induced into the model. If not, then the stability of the c...

Unsupervised learning of an efficient short-term memory network

Learning in recurrent neural networks has been a topic fraught with difficulties and problems. We here report substantial progress in the unsupervised learning of recurrent networks that can keep track of an input signal. Specifically, we show how these networks can learn to efficiently represent their present and past inputs, based on local learning rules only. Our results are based on several...

Recurrent Learning of Input-output Stable Behaviour in Function Space: a Case Study with the Roessler Attractor

We analyse the stability of the input-output behaviour of a recurrent network trained to implement an operator implicitly given by the chaotic dynamics of the Roessler attractor. Two of the attractor's coordinate functions are used as network input and the third defines the reference output. Using recently developed methods we show that the trained network is input-output stable and c...
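A minimal sketch of how such a training set could be assembled is given below, assuming the standard Roessler parameters a = 0.2, b = 0.2, c = 5.7 and an arbitrary sampling step; neither is necessarily what the cited study used.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Roessler system with the standard chaotic parameters (an assumption,
    # not necessarily the values used in the cited study).
    a, b, c = 0.2, 0.2, 5.7

    def roessler(t, s):
        x, y, z = s
        return [-y - z, x + a * y, b + z * (x - c)]

    # Integrate one long trajectory and discard the initial transient.
    t_eval = np.arange(0.0, 500.0, 0.05)
    sol = solve_ivp(roessler, (0.0, 500.0), [1.0, 1.0, 1.0], t_eval=t_eval)
    x, y, z = sol.y[:, 2000:]            # drop the first 100 time units

    # Input-output pairs for the recurrent network: the (x, y) coordinate
    # functions drive the network, z is the reference output to reproduce.
    inputs = np.stack([x, y], axis=1)    # shape (T, 2)
    target = z                           # shape (T,)
    print(inputs.shape, target.shape)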


Journal:

Volume   Issue

Pages  -

Publication date: 2000